Ranking with Large Margin Principle: Two Approaches
Authors

Abstract
We discuss the problem of ranking k instances with the use of a "large margin" principle. We introduce two main approaches: the first is the "fixed margin" policy, in which the margin of the closest neighboring classes is maximized; this turns out to be a direct generalization of SVM to ranking learning. The second approach allows for k different margins, where the sum of margins is maximized. This approach is shown to reduce to ν-SVM when the number of classes is k = 2. Both approaches are optimal in the size of l, where l is the total number of training examples. Experiments performed on visual classification and "collaborative filtering" show that both approaches outperform existing ordinal regression algorithms applied to ranking, as well as multi-class SVM applied to general multi-class classification.
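To illustrate the fixed-margin idea, the sketch below learns a single direction w and k−1 ordered thresholds, so that each rank corresponds to an interval of the projected score. This is a toy subgradient-descent sketch of a fixed-margin ordinal hinge loss, not the paper's quadratic-program formulation; all function names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def predict_rank(w, thresholds, x):
    """Rank = index of the interval of sorted thresholds that w.x falls into."""
    score = np.dot(w, x)
    return int(np.searchsorted(thresholds, score))  # rank in {0, ..., k-1}

def train_fixed_margin(X, y, k, epochs=1000, lr=0.02, C=1.0):
    """Naive subgradient descent: each example of rank r is pushed to satisfy
    b[r-1] + 1 <= w.x <= b[r] - 1 (unit margins on both sides)."""
    n, d = X.shape
    w = np.zeros(d)
    b = np.arange(k - 1, dtype=float)       # ordered thresholds b[0] <= ... <= b[k-2]
    for _ in range(epochs):
        gw = w.copy()                        # gradient of the 0.5*||w||^2 regularizer
        gb = np.zeros(k - 1)
        for xi, yi in zip(X, y):
            s = np.dot(w, xi)
            if yi < k - 1 and s > b[yi] - 1:      # upper-side hinge active
                gw += C * xi
                gb[yi] -= C
            if yi > 0 and s < b[yi - 1] + 1:      # lower-side hinge active
                gw -= C * xi
                gb[yi - 1] += C
        w -= lr * gw
        b -= lr * gb
        b.sort()                             # keep thresholds ordered
    return w, b
```

On a well-separated 1-D toy set with three ranks, the learned direction and thresholds recover the ordering; the key design point is that all ranks share one direction w, and only the thresholds differ.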
Similar Resources
Taxonomy of Large Margin Principle Algorithms for Ordinal Regression Problems
We discuss the problem of ranking instances where an instance is associated with an integer from 1 to k. In other words, the specialization of the general multi-class learning problem when there exists an ordering among the instances — a problem known as “ordinal regression” or “ranking learning”. This problem arises in various settings both in visual recognition and other information retrieval...
Efficient Margin-Based Rank Learning Algorithms for Information Retrieval
Learning a good ranking function plays a key role for many applications including the task of (multimedia) information retrieval. While there are a few rank learning methods available, most of them need to explicitly model the relations between every pair of relevant and irrelevant documents, and thus result in an expensive training process for large collections. The goal of this paper is to pr...
A Systematic Cross-Comparison of Sequence Classifiers
In the CoNLL 2003 NER shared task, more than two thirds of the submitted systems used a feature-rich representation of the task. Most of them used the maximum entropy principle to combine the features together. Others used large margin linear classifiers, such as SVM and RRM. In this paper, we compare several common classifiers under exactly the same conditions, demonstrating that the ranking o...
Learning Discriminative Features with Multiple Granularities for Person Re-Identification
The combination of global and partial features has been an essential solution to improve discriminative performance in person re-identification (Re-ID) tasks. Previous part-based methods mainly focus on locating regions with specific pre-defined semantics to learn local representations, which increases learning difficulty and is neither efficient nor robust in scenarios with large variances. In this p...
Perturbation based Large Margin Approach for Ranking
The use of the standard hinge loss for structured outputs, for the learning to rank problem, faces two main caveats: (a) the label space, the set of all possible permutations of items to be ranked, is too large, and also less amenable to the usual dynamic-programming based techniques used for structured outputs, and (b) the supervision or training data consists of instances with multiple labels...
Publication date: 2002